Differentially-Private Orthogonal Tensor Decomposition

Authors

  • Hafiz Imtiaz
  • Anand D. Sarwate
Abstract

Differential privacy has recently received significant research attention for its robustness against known attacks. Tensor decomposition has applications in many areas, including signal processing, machine learning, computer vision, and neuroscience. In this paper, we focus on differentially-private orthogonal decomposition of the symmetric tensors that arise in several latent variable models, and we propose two new algorithms. We empirically compare their performance with a recently proposed algorithm as well as a non-private algorithm, under varying privacy parameters and database parameters. We show that our proposed algorithms provide good utility while preserving strict privacy guarantees.


Similar papers

Online and Differentially-Private Tensor Decomposition

Tensor decomposition is an important tool for big data analysis. In this paper, we resolve many of the key algorithmic questions regarding robustness, memory efficiency, and differential privacy of tensor decomposition. We propose simple variants of the tensor power method which enjoy these strong properties. We present the first guarantees for online tensor power method which has a...


Orthogonal Tensor Decomposition

In symmetric tensor decomposition one expresses a given symmetric tensor T as a sum of tensor powers of a number of vectors: T = v1^⊗d + · · · + vk^⊗d. Orthogonal decomposition is a special type of symmetric tensor decomposition in which, in addition, the vectors v1, ..., vk are required to be pairwise orthogonal. We study the properties of orthogonally decomposable tensors. In particular, we give...
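The decomposition T = v1^⊗d + · · · + vk^⊗d described above can be illustrated numerically. The following is a minimal sketch (not from the paper; dimensions, weights, and the iteration count are illustrative assumptions) that builds an orthogonally decomposable symmetric 3-tensor and recovers one component with the tensor power iteration:

```python
import numpy as np

# Illustrative sketch: construct T = sum_i lam[i] * v_i (x) v_i (x) v_i
# with orthonormal v_i, then recover one v_i by tensor power iteration.
rng = np.random.default_rng(0)
n, k = 5, 3
# Random orthonormal components v_1..v_k (columns of Q).
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
lam = np.array([3.0, 2.0, 1.0])  # assumed positive weights

T = np.zeros((n, n, n))
for i in range(k):
    v = Q[:, i]
    T += lam[i] * np.einsum('a,b,c->abc', v, v, v)

# Power iteration: u <- T(I, u, u) / ||T(I, u, u)||.
u = rng.standard_normal(n)
u /= np.linalg.norm(u)
for _ in range(100):
    u = np.einsum('abc,b,c->a', T, u, u)
    u /= np.linalg.norm(u)

# For orthogonally decomposable tensors, u converges (up to sign)
# to one of the components, so some |<v_i, u>| approaches 1.
overlaps = np.abs(Q.T @ u)
print(overlaps.max())
```

For orthogonally decomposable tensors the components are the robust eigenvectors of this iteration, which is why the recovered u aligns with one column of Q rather than a mixture.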


Robust polynomial time tensor decomposition

Tensor decomposition has recently become an invaluable algorithmic primitive. It has seen much use in new algorithms with provable guarantees for fundamental statistics and machine learning problems. In these settings, some low-rank k-tensor A = ∑_{i=1}^r ai^⊗k which we would like to decompose into components a1, . . . , ar ∈ ℝ^n is often not directly accessible. This could happen for many reasons; a...


Symmetric Orthogonal Tensor Decomposition is Trivial

We consider the problem of decomposing a real-valued symmetric tensor as the sum of outer products of real-valued, pairwise orthogonal vectors. Such decompositions do not generally exist, but we show that some symmetric tensor decomposition problems can be converted to orthogonal problems following the whitening procedure proposed by Anandkumar et al. (2012). If an orthogonal decomposition of a...
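The whitening procedure mentioned above can be sketched in a few lines. This is a hedged illustration of the general idea (the variable names, weights, and rank-k eigendecomposition setup are assumptions, not taken from the cited works): a whitening matrix W computed from the second-moment matrix M2 = ∑_i w_i a_i a_i^T makes the rescaled components orthonormal, converting a non-orthogonal problem into an orthogonal one.

```python
import numpy as np

rng = np.random.default_rng(1)
d, k = 6, 3
# Assumed setup: non-orthogonal components a_i (columns of A)
# with positive weights w_i.
A = rng.standard_normal((d, k))
w = np.array([1.0, 2.0, 3.0])

# Second-moment matrix M2 = sum_i w_i * a_i a_i^T.
M2 = (A * w) @ A.T

# Whitening matrix W from the top-k eigenpairs, so W^T M2 W = I_k.
vals, vecs = np.linalg.eigh(M2)
vals, vecs = vals[-k:], vecs[:, -k:]
W = vecs / np.sqrt(vals)

# The whitened vectors sqrt(w_i) * W^T a_i are pairwise orthonormal,
# since V V^T = W^T M2 W = I_k and V is square.
V = (W.T @ A) * np.sqrt(w)
print(np.allclose(V @ V.T, np.eye(k)))
```

Applying the same W along every mode of the input tensor produces a smaller tensor whose components are these orthonormal whitened vectors, which is what makes the orthogonal decomposition machinery applicable.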


A Counterexample to the Possibility of an Extension of the Eckart-Young Low-Rank Approximation Theorem for the Orthogonal Rank Tensor Decomposition

Earlier work has shown that no extension of the Eckart–Young SVD approximation theorem can be made to the strong orthogonal rank tensor decomposition. Here, we present a counterexample to the extension of the Eckart–Young SVD approximation theorem to the orthogonal rank tensor decomposition, answering an open question previously posed by Kolda [SIAM J. Matrix Anal. Appl., 23 (2001), pp. 243–255].




Publication date: 2016